MiniCPM is a series of edge-side large language models jointly developed by ModelBest Inc. (面壁智能) and the Tsinghua University Natural Language Processing Lab (THUNLP). The core model has only 1.2 billion non-embedding parameters, yet it outperforms larger open-source models on multiple benchmarks.
Tags: Large Language Model · Transformers · Multilingual
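Since the model is distributed for use with Hugging Face Transformers, a minimal loading-and-generation sketch is shown below. The checkpoint name `openbmb/MiniCPM-1B-sft-bf16` is an assumption for illustration; substitute the checkpoint you actually intend to use.

```python
# Minimal sketch: load a MiniCPM checkpoint with Hugging Face Transformers and generate text.
# The repository id below is an assumed example, not a confirmed reference from this document.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-1B-sft-bf16"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,      # bf16 weights keep the memory footprint small
    trust_remote_code=True,          # the repo may ship custom modeling code
    device_map="auto",               # place the model on GPU if one is available
)

prompt = "Briefly introduce the MiniCPM model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For edge or CPU-only deployment, the same sketch works with `torch_dtype=torch.float32` and without `device_map`, at the cost of slower generation.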